Link Grammar


On Unsupervised Training of Link Grammar Based Language Models

Mikhaylovskiy, Nikolay

arXiv.org Artificial Intelligence

In this short note we explore what is needed for unsupervised training of graph language models based on link grammars. First, we introduce the termination tags formalism required to build a language model on top of the link grammar formalism of Sleator and Temperley [21], and discuss the influence of context on the unsupervised learning of link grammars. Second, we propose a statistical link grammar formalism that allows for statistical language generation. Third, based on the above formalism, we show that the classical dissertation of Yuret [25] on the discovery of linguistic relations using lexical attraction ignores contextual properties of the language, and thus an approach to unsupervised language learning that relies on bigrams alone is flawed. This correlates well with the unimpressive results of unsupervised training of graph language models based on Yuret's bigram approach.
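To make the bigram objection concrete, a lexical-attraction model in Yuret's sense scores a linkage by the pointwise mutual information of its linked word pairs. The sketch below is a minimal, hypothetical illustration (a crude same-sentence co-occurrence estimator stands in for Yuret's iterative training, and all names are invented): every candidate link is scored by its two words alone, so no surrounding context ever enters the decision.

    import math
    from collections import Counter
    from itertools import combinations

    def pmi_table(sentences):
        # Estimate pointwise mutual information for ordered word pairs
        # from a toy corpus.  Illustrative only; not Yuret's estimator.
        word_counts, pair_counts = Counter(), Counter()
        total_words = total_pairs = 0
        for sent in sentences:
            word_counts.update(sent)
            total_words += len(sent)
            for a, b in combinations(sent, 2):
                pair_counts[a, b] += 1
                total_pairs += 1
        return {
            (a, b): math.log2((n / total_pairs) /
                              ((word_counts[a] / total_words) *
                               (word_counts[b] / total_words)))
            for (a, b), n in pair_counts.items()
        }

    def greedy_linkage(sentence, pmi):
        # Link word pairs in descending lexical attraction, rejecting
        # links that would cross an accepted one (planarity constraint).
        candidates = sorted(
            ((pmi.get((sentence[i], sentence[j]), float("-inf")), i, j)
             for i in range(len(sentence))
             for j in range(i + 1, len(sentence))),
            reverse=True)
        links = []
        for score, i, j in candidates:
            if score <= 0:
                break
            if all(not (k < i < l < j or i < k < j < l) for k, l in links):
                links.append((i, j))
        return links

In this setting the attraction between, say, "eats" and "fish" is a fixed number regardless of what else the sentence contains, which is precisely the context-blindness the note argues against.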


Natural Language Generation Using Link Grammar for General Conversational Intelligence

Ramesh, Vignav, Kolonin, Anton

arXiv.org Artificial Intelligence

Many current artificial general intelligence (AGI) and natural language processing (NLP) architectures do not possess general conversational intelligence; that is, they either do not deal with language or are unable to convey knowledge in a form similar to human language without manual, labor-intensive methods such as template-based customization. In this paper, we propose a new technique to automatically generate grammatically valid sentences using the Link Grammar database. This natural language generation method far outperforms current state-of-the-art baselines and may serve as the final component in a proto-AGI question answering pipeline that handles natural language material in an understandable way.
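As a rough illustration of how Link Grammar constrains generation, the toy sketch below uses a hypothetical four-word lexicon, not the real Link Grammar database: each word carries "+" (rightward) and "-" (leftward) connectors, a word order is accepted only if every connector finds a mate without crossing links (approximated here by stack-based nesting), and generation reduces to keeping only the orderings that parse.

    from itertools import permutations

    # Hypothetical toy lexicon in Link Grammar style; the real database
    # assigns far richer connector formulas.  A "D+" connector must link
    # rightward to a matching "D-", and so on.
    LEXICON = {
        "the":  [["D+"]],                      # determiner looks right for a noun
        "cat":  [["D-", "S+"], ["D-", "O-"]],  # noun as subject or as object
        "fish": [["D-", "S+"], ["D-", "O-"]],
        "eats": [["S-", "O+"]],                # verb: subject left, object right
    }

    def links_ok(disjuncts):
        # Match connectors left to right with a stack, which permits only
        # nested (non-crossing) links: a simplified stand-in for Link
        # Grammar's planarity requirement.
        stack = []
        for conns in disjuncts:
            for c in conns:
                label, direction = c[:-1], c[-1]
                if direction == "-":
                    if not stack or stack[-1] != label:
                        return False
                    stack.pop()
                else:
                    stack.append(label)
        return not stack  # every "+" connector found its "-" mate

    def grammatical(words):
        # Accept the word order if some choice of disjuncts links up.
        def expand(i, chosen):
            if i == len(words):
                return links_ok(chosen)
            return any(expand(i + 1, chosen + [d]) for d in LEXICON[words[i]])
        return expand(0, [])

    # Brute-force "generation": keep only the orderings that parse.
    words = ["the", "cat", "eats", "the", "fish"]
    for sentence in sorted({" ".join(p) for p in permutations(words)
                            if grammatical(p)}):
        print(sentence)  # the two grammatical subject-verb-object orders

Of the 120 orderings of these five words, only "the cat eats the fish" and "the fish eats the cat" survive the connector check, which is the filtering effect the paper relies on at the scale of the full database.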


Parsing English with a link grammar

Sleator, D., Temperley, D.

Classics
